Rademacher Margin Complexity

Authors

  • Liwei Wang
  • Jufu Feng
Abstract

Rn(F) = Eσ [ sup_{f ∈ F} (1/n) Σ_{i=1}^n σi f(xi) ],

where σ1, ..., σn are i.i.d. Rademacher random variables. Rn(F) characterizes the extent to which the functions in F can best be correlated with a Rademacher noise sequence. A number of generalization error bounds have been proposed based on Rademacher complexity [1,2]. In this open problem, we introduce a new complexity measure for function classes. We focus on function classes F that are the convex hull of a base class H consisting of indicator functions. Hence each f ∈ F is a voting classifier of the form

f(x) = Σ_t αt ht(x),   αt ≥ 0,   Σ_t αt = 1,   ht ∈ H.
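As an illustrative sketch (not part of the original problem statement), the empirical Rademacher complexity in the formula above can be estimated by Monte Carlo: draw random sign vectors σ and average the best achievable correlation over the class. The toy base class H below, one-dimensional threshold indicators h_t(x) = 1[x ≥ t] on a finite grid, and all sample sizes are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 50                                     # sample size
x = np.sort(rng.uniform(0.0, 1.0, size=n))  # fixed sample x_1, ..., x_n
thresholds = np.linspace(0.0, 1.0, 101)     # finite grid approximating H

# Rows: functions h_t in H; columns: their values h_t(x_i) on the sample.
H = (x[None, :] >= thresholds[:, None]).astype(float)

def empirical_rademacher(H, n_draws=2000, rng=rng):
    """Monte Carlo estimate of E_sigma sup_h (1/n) sum_i sigma_i h(x_i)."""
    n = H.shape[1]
    total = 0.0
    for _ in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)  # i.i.d. Rademacher signs
        total += np.max(H @ sigma) / n           # best correlation over H
    return total / n_draws

print(f"estimated Rn(H): {empirical_rademacher(H):.3f}")
```

Since the Rademacher complexity of a convex hull equals that of its base class, Rn(conv(H)) = Rn(H), estimating it on the finite base class H also gives the complexity of the voting classifiers in F.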


Similar articles

Rademacher Complexity Margin Bounds for Learning with a Large Number of Classes

This paper presents improved Rademacher complexity margin bounds that scale linearly with the number of classes as opposed to the quadratic dependence of existing Rademacher complexity margin-based learning guarantees. We further use this result to prove a novel generalization bound for multi-class classifier ensembles that depends only on the Rademacher complexity of the hypothesis classes to ...


Rademacher Complexity Bounds for a Penalized Multiclass Semi-Supervised Algorithm

We propose Rademacher complexity bounds for multiclass classifiers trained with a two-step semi-supervised model. In the first step, the algorithm partitions the partially labeled data and then identifies dense clusters containing κ predominant classes using the labeled training examples such that the proportion of their non-predominant classes is below a fixed threshold. In the second step, a ...


Rademacher Complexity Bounds for Non-I.I.D. Processes

This paper presents the first Rademacher complexity-based error bounds for non-i.i.d. settings, a generalization of similar existing bounds derived for the i.i.d. case. Our bounds hold in the scenario of dependent samples generated by a stationary β-mixing process, which is commonly adopted in many previous studies of non-i.i.d. settings. They benefit from the crucial advantages of Rademacher com...


Large-Margin Matrix Factorization

We present a novel approach to collaborative prediction, using low-norm instead of low-rank factorizations. The approach is inspired by, and has strong connections to, large-margin linear discrimination. We show how to learn low-norm factorizations by solving a semi-definite program, and present generalization error bounds based on analyzing the Rademacher complexity of low-norm factorizations.


Multi-Class Classification with Maximum Margin Multiple Kernel

We present a new algorithm for multi-class classification with multiple kernels. Our algorithm is based on a natural notion of the multi-class margin of a kernel. We show that larger values of this quantity guarantee the existence of an accurate multi-class predictor and also define a family of multiple kernel algorithms based on the maximization of the multi-class margin of a kernel (MK). We p...




Publication date: 2007